Asymmetric kernel in Gaussian Processes for learning target variance
Authors
Abstract

This work incorporates the multi-modality of the data distribution into a Gaussian Process regression model. We approach the problem from a discriminative perspective by learning, jointly over the training data, the target-space variance in the neighborhood of a given sample through metric learning. We start by using data centers rather than all training samples. Subsequently, each center sel...

Similar resources
Asymmetric Transfer Learning with Deep Gaussian Processes
We introduce a novel Gaussian-process-based Bayesian model for asymmetric transfer learning. We adopt a two-layer feed-forward deep Gaussian process as the task learner for the source and target domains. The first layer projects the data onto a separate non-linear manifold for each task. We perform knowledge transfer by also projecting the target data onto the source domain and linearly combining it...
Kernel Distillation for Gaussian Processes
Gaussian processes (GPs) are flexible models that can capture complex structure in large-scale datasets thanks to their non-parametric nature. However, the use of GPs in real-world applications is limited by their high computational cost at inference time. In this paper, we introduce a new framework, kernel distillation, for kernel matrix approximation. The idea is adapted from knowledge distillat...
Multi-Kernel Gaussian Processes
Although Gaussian process inference is usually formulated for a single output, in many machine learning problems the objective is to infer multiple tasks jointly, possibly exploring the dependencies between them to improve results. Real-world examples of this problem include ore mining, where the objective is to infer the concentration of several chemical components to assess the ore quality. Si...
Product Kernel Interpolation for Scalable Gaussian Processes
Recent work shows that inference for Gaussian processes can be performed efficiently using iterative methods that rely only on matrix-vector multiplications (MVMs). Structured Kernel Interpolation (SKI) exploits these techniques by deriving approximate kernels with very fast MVMs. Unfortunately, such strategies suffer badly from the curse of dimensionality. We develop a new technique for MVM ba...
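The MVM-only inference strategy described above can be illustrated with a plain conjugate-gradient solver that accesses the kernel matrix exclusively through matrix-vector products. This is a minimal sketch, not the SKI method itself: the dense kernel below exists only for demonstration, whereas SKI's contribution is to replace that product with fast structured algebra.

```python
import numpy as np

def conjugate_gradient(mvm, b, tol=1e-8, max_iter=200):
    """Solve A x = b for SPD A, using only the product mvm(v) = A v."""
    x = np.zeros_like(b)
    r = b - mvm(x)          # residual
    p = r.copy()            # search direction
    rs = r @ r
    for _ in range(max_iter):
        Ap = mvm(p)
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Demonstration: an explicit RBF kernel matrix with jitter (SPD). In an
# MVM-based GP framework, the lambda below would be a structured fast product.
rng = np.random.default_rng(1)
X = rng.standard_normal((50, 2))
K = np.exp(-0.5 * ((X[:, None] - X[None]) ** 2).sum(-1)) + 1e-2 * np.eye(50)
y = rng.standard_normal(50)
x = conjugate_gradient(lambda v: K @ v, y)  # approximates K^{-1} y
```

Because the solver never factorizes or even materializes `K` on its own, any representation supporting a fast `K @ v` (Kronecker, Toeplitz, interpolated grids) plugs in unchanged.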
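The paper's central idea, a GP regression model whose target variance differs per training sample, can be illustrated with a minimal heteroscedastic GP regressor. This is a hedged sketch in plain NumPy, not the authors' method: the metric-learning step that would produce the per-sample variances is replaced by an assumed, fixed noise vector, and `rbf_kernel`, `gp_predict`, and the lengthscale are illustrative choices.

```python
import numpy as np

def rbf_kernel(A, B, lengthscale=0.2):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

def gp_predict(X_train, y_train, X_test, noise_var, lengthscale=0.2):
    """GP regression posterior mean/variance with per-sample noise.

    `noise_var` holds one target-variance value per training point, the
    quantity the paper proposes to learn; here it is simply assumed given.
    """
    K = rbf_kernel(X_train, X_train, lengthscale) + np.diag(noise_var)
    Ks = rbf_kernel(X_test, X_train, lengthscale)
    Kss = rbf_kernel(X_test, X_test, lengthscale)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.diag(cov)

# Toy 1-D example: noisier targets on the right half of the input range.
X = np.linspace(0.0, 1.0, 20)[:, None]
y = np.sin(2.0 * np.pi * X[:, 0])
noise = np.where(X[:, 0] < 0.5, 1e-4, 1e-2)  # assumed per-sample variance
mu, var = gp_predict(X, y, X, noise)
```

The only change relative to a standard GP is that the scalar noise term on the kernel diagonal becomes a vector, so predictions are pulled less tightly toward targets in high-variance regions.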
Journal

Journal title: Pattern Recognition Letters
Year: 2018
ISSN: 0167-8655
DOI: 10.1016/j.patrec.2018.02.026